A Simple Method For Estimating Conditional Probabilities For SVMs
Author
Abstract
Support Vector Machines (SVMs) have become a popular learning algorithm, in particular for large, high-dimensional classification problems. SVMs have been shown to give highly accurate classification results in a variety of applications. Several methods have been proposed to obtain not only a classification, but also an estimate of the SVM's confidence in the correctness of the predicted label. In this paper, several algorithms are compared which scale the SVM decision function to obtain an estimate of the conditional class probability. A new, simple, and fast method is derived from theoretical arguments and empirically compared to the existing approaches.
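The paper does not show its method here, but the family of "scaling" approaches it compares can be illustrated by Platt scaling, a well-known instance that fits a sigmoid to held-out SVM decision values. The sketch below is a hypothetical illustration using scikit-learn (the dataset, split, and parameters are assumptions, not the paper's setup):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

# Platt-style scaling: map an SVM decision value f(x) to an estimate of
# P(y = 1 | x) via the sigmoid 1 / (1 + exp(A*f(x) + B)), with (A, B)
# fitted on a held-out calibration set.
X, y = make_classification(n_samples=400, n_features=20, random_state=0)
X_tr, X_cal, y_tr, y_cal = train_test_split(X, y, test_size=0.5, random_state=0)

svm = LinearSVC(C=1.0).fit(X_tr, y_tr)
f_cal = svm.decision_function(X_cal).reshape(-1, 1)

# Fitting a 1-D logistic regression on f(x) recovers the sigmoid parameters.
platt = LogisticRegression().fit(f_cal, y_cal)
probs = platt.predict_proba(f_cal)[:, 1]  # estimated P(y = 1 | x)
```

Fitting the sigmoid on data not used to train the SVM avoids the bias introduced by calibrating on the same decision values the margin was optimized for.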
Similar Papers
Sparseness vs Estimating Conditional Probabilities: Some Asymptotic Results
One of the nice properties of kernel classifiers such as SVMs is that they often produce sparse solutions. However, the decision functions of these classifiers cannot always be used to estimate the conditional probability of the class label. We investigate the relationship between these two properties and show that these are intimately related: sparseness does not occur when the conditional pro...
Full Text
Full Text
Adaptive Nearest Neighbor Classification Using
The nearest neighbor technique is a simple and appealing method to address classification problems. It relies on the assumption of locally constant class conditional probabilities. This assumption becomes invalid in high dimensions with a finite number of examples due to the curse of dimensionality. Severe bias can be introduced under these conditions when using the nearest neighbor rule. The empl...
Full Text
Efficient Local Flexible Nearest Neighbor Classification
The nearest neighbor technique is a simple and appealing method to address classification problems. It relies on the assumption of locally constant class conditional probabilities. This assumption becomes invalid in high dimensions with a finite number of examples due to the curse of dimensionality. Severe bias can be introduced under these conditions when using the nearest neighbor rule. The e...
Full Text
Efficient Simulation of a Random Knockout Tournament
We consider the problem of using simulation to efficiently estimate the win probabilities for participants in a general random knockout tournament. Both of our proposed estimators, one based on the notion of “observed survivals” and the other based on conditional expectation and post-stratification, are highly effective in terms of variance reduction when compared to the raw simulation estimato...
Full Text
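The raw simulation estimator that the tournament abstract uses as its baseline can be sketched as follows. This is a hypothetical illustration, not the paper's code: the bracket structure, the `beat_prob` matrix, and the player count are all assumptions.

```python
import random

def simulate_tournament(players, beat_prob, rng):
    """Play one single-elimination tournament; return the winner.

    beat_prob[a][b] is the (assumed) probability that player a beats b.
    """
    round_ = list(players)
    while len(round_) > 1:
        next_round = []
        # Pair adjacent players in the current round.
        for a, b in zip(round_[::2], round_[1::2]):
            next_round.append(a if rng.random() < beat_prob[a][b] else b)
        round_ = next_round
    return round_[0]

def estimate_win_probs(players, beat_prob, n_sims=10000, seed=0):
    """Raw Monte Carlo estimate: fraction of tournaments each player wins."""
    rng = random.Random(seed)
    wins = {p: 0 for p in players}
    for _ in range(n_sims):
        wins[simulate_tournament(players, beat_prob, rng)] += 1
    return {p: w / n_sims for p, w in wins.items()}

# Toy example: player 0 beats anyone with probability 0.8; others are even.
beat_prob = {i: {j: (0.8 if i == 0 else 0.2 if j == 0 else 0.5)
                 for j in range(4)} for i in range(4)}
est = estimate_win_probs(range(4), beat_prob)
```

The estimators proposed in that paper reduce the variance of exactly this kind of raw win-count average, e.g. by conditioning on which matches a player would survive.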